📚 node [[n gram|n gram]]
⥅ related node [[neuro linguistic programming]]
⥅ related node [[2003 07 22 wired news tired of the telly reprogram it]]
⥅ related node [[2010 08 01 were not teaching programming and we should be]]
⥅ related node [[2011 04 11 how to play instagram]]
⥅ related node [[2020 03 14 on instagram off facebook]]
⥅ related node [[meditation for programmers]]
⥅ related node [[concepts techniques and models of computer programming]]
⥅ related node [[daniel ingram]]
⥅ related node [[daniel m ingram]]
⥅ related node [[pattern languages of programs]]
⥅ related node [[resources to learn to program]]
⥅ related node [[structure and interpretation of computer programs]]
⥅ related node [[tindallgrams]]
⥅ related node [[certification build ai powered chatbots without programming (ibm)]]
⥅ related node [[n gram]]
⥅ related node [[lionel shriver on grammar]]
⥅ related node [[bring back instagram]]
⥅ related node [[instagram]]
⥅ related node [[2020 09 07 instagram disabled]]
⥅ related node [[a new grammar of organization]]
⥅ related node [[an iterative enquiry diagram for digital ecosocialism]]
⥅ related node [[an iterative inquiry diagram for digital ecosocialism]]
⥅ related node [[antonio gramsci]]
⥅ related node [[google joins samsung in working with ifixit on a self repair program]]
⥅ related node [[linear programming]]
⥅ related node [[plantuml for weeknote diagrams]]
⥅ related node [[samsung is working on a galaxy self repair program with ifixit]]
⥅ related node [[stock and flow diagram]]
⥅ related node [[structural adjustment program]]
⥅ related node [[systems thinking diagrams in plantuml]]
⥅ related node [[the uk should have a national programme of home insulation]]
⥅ related node [[obsolescencia programada]]
⥅ related node [[20200604222704 functional_programming]]
⥅ related node [[20200629133954 nix_programming_language]]
⥅ related node [[20210311213212 object_oriented_programming]]
⥅ related node [[20210423171911 concurrent_programming]]
⥅ related node [[20210516174118 context_free_grammar]]
⥅ related node [[20210521211721 function_programming]]
⥅ related node [[20210607154216 pure_functional_programming]]
⥅ related node [[20210503215708 7_lines_of_code_3_minutes_implement_a_programming_language_from_scratch]]
⥅ node [[n-gram]] pulled by Agora

N-gram

Go back to the [[AI Glossary]]

#seq

An ordered sequence of N words. For example, *truly madly* is a 2-gram. Because order is relevant, *madly truly* is a different 2-gram than *truly madly*.

| N | Name(s) for this kind of N-gram | Examples |
|---|---------------------------------|----------|
| 2 | bigram or 2-gram | *to go, go to, eat lunch, eat dinner* |
| 3 | trigram or 3-gram | *ate too much, three blind mice, the bell tolls* |
| 4 | 4-gram | *walk in the park, dust in the wind, the boy ate lentils* |
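
The n-grams of a token sequence can be produced with a simple sliding window. A minimal Python sketch (the function name `ngrams` and the example sentence are illustrative, not part of the glossary):

```python
def ngrams(words, n):
    """Return the list of n-grams: tuples of n consecutive words."""
    return [tuple(words[i:i + n]) for i in range(len(words) - n + 1)]

tokens = "the boy ate lentils".split()
print(ngrams(tokens, 2))
# [('the', 'boy'), ('boy', 'ate'), ('ate', 'lentils')]
```

Note that order is preserved: `('the', 'boy')` and `('boy', 'the')` would be distinct 2-grams.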

Many natural language understanding models rely on N-grams to predict the next word that the user will type or say. For example, suppose a user typed *three blind*. An NLU model based on trigrams would likely predict that the user will next type *mice*.
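
As a rough sketch of that idea (assuming a toy corpus and counting-based prediction; real NLU models are far more elaborate), a trigram model can map each two-word context to the words that followed it and predict the most frequent one:

```python
from collections import Counter, defaultdict

def train_trigrams(corpus):
    """Map each two-word context to a Counter of the words that followed it."""
    model = defaultdict(Counter)
    words = corpus.split()
    for a, b, c in zip(words, words[1:], words[2:]):
        model[(a, b)][c] += 1
    return model

def predict(model, a, b):
    """Return the most frequent word observed after the context (a, b)."""
    counts = model.get((a, b))
    return counts.most_common(1)[0][0] if counts else None

# Toy corpus chosen for illustration only
model = train_trigrams("three blind mice see how they run three blind mice")
print(predict(model, "three", "blind"))
# mice
```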

Contrast N-grams with [[bag of words]], which are unordered sets of words.
